Generated for model: models/resnet50_prune10pct_best_model.pth
----------------------------------------------------------------
Layer (type) Output Shape Param #
================================================================
Conv2d-1 [-1, 64, 112, 112] 9,408
BatchNorm2d-2 [-1, 64, 112, 112] 128
MaxPool2d-3 [-1, 64, 56, 56] 0
Conv2d-4 [-1, 64, 56, 56] 4,096
BatchNorm2d-5 [-1, 64, 56, 56] 128
Conv2d-6 [-1, 64, 56, 56] 36,864
BatchNorm2d-7 [-1, 64, 56, 56] 128
Conv2d-8 [-1, 256, 56, 56] 16,384
BatchNorm2d-9 [-1, 256, 56, 56] 512
Conv2d-10 [-1, 256, 56, 56] 16,384
BatchNorm2d-11 [-1, 256, 56, 56] 512
Bottleneck-12 [-1, 256, 56, 56] 0
Conv2d-13 [-1, 64, 56, 56] 16,384
BatchNorm2d-14 [-1, 64, 56, 56] 128
Conv2d-15 [-1, 64, 56, 56] 36,864
BatchNorm2d-16 [-1, 64, 56, 56] 128
Conv2d-17 [-1, 256, 56, 56] 16,384
BatchNorm2d-18 [-1, 256, 56, 56] 512
Bottleneck-19 [-1, 256, 56, 56] 0
Conv2d-20 [-1, 64, 56, 56] 16,384
BatchNorm2d-21 [-1, 64, 56, 56] 128
Conv2d-22 [-1, 64, 56, 56] 36,864
BatchNorm2d-23 [-1, 64, 56, 56] 128
Conv2d-24 [-1, 256, 56, 56] 16,384
BatchNorm2d-25 [-1, 256, 56, 56] 512
Bottleneck-26 [-1, 256, 56, 56] 0
Conv2d-27 [-1, 128, 56, 56] 32,768
BatchNorm2d-28 [-1, 128, 56, 56] 256
Conv2d-29 [-1, 128, 28, 28] 147,456
BatchNorm2d-30 [-1, 128, 28, 28] 256
Conv2d-31 [-1, 512, 28, 28] 65,536
BatchNorm2d-32 [-1, 512, 28, 28] 1,024
Conv2d-33 [-1, 512, 28, 28] 131,072
BatchNorm2d-34 [-1, 512, 28, 28] 1,024
Bottleneck-35 [-1, 512, 28, 28] 0
Conv2d-36 [-1, 128, 28, 28] 65,536
BatchNorm2d-37 [-1, 128, 28, 28] 256
Conv2d-38 [-1, 128, 28, 28] 147,456
BatchNorm2d-39 [-1, 128, 28, 28] 256
Conv2d-40 [-1, 512, 28, 28] 65,536
BatchNorm2d-41 [-1, 512, 28, 28] 1,024
Bottleneck-42 [-1, 512, 28, 28] 0
Conv2d-43 [-1, 128, 28, 28] 65,536
BatchNorm2d-44 [-1, 128, 28, 28] 256
Conv2d-45 [-1, 128, 28, 28] 147,456
BatchNorm2d-46 [-1, 128, 28, 28] 256
Conv2d-47 [-1, 512, 28, 28] 65,536
BatchNorm2d-48 [-1, 512, 28, 28] 1,024
Bottleneck-49 [-1, 512, 28, 28] 0
Conv2d-50 [-1, 128, 28, 28] 65,536
BatchNorm2d-51 [-1, 128, 28, 28] 256
Conv2d-52 [-1, 128, 28, 28] 147,456
BatchNorm2d-53 [-1, 128, 28, 28] 256
Conv2d-54 [-1, 512, 28, 28] 65,536
BatchNorm2d-55 [-1, 512, 28, 28] 1,024
Bottleneck-56 [-1, 512, 28, 28] 0
Conv2d-57 [-1, 256, 28, 28] 131,072
BatchNorm2d-58 [-1, 256, 28, 28] 512
Conv2d-59 [-1, 256, 14, 14] 589,824
BatchNorm2d-60 [-1, 256, 14, 14] 512
Conv2d-61 [-1, 1024, 14, 14] 262,144
BatchNorm2d-62 [-1, 1024, 14, 14] 2,048
Conv2d-63 [-1, 1024, 14, 14] 524,288
BatchNorm2d-64 [-1, 1024, 14, 14] 2,048
Bottleneck-65 [-1, 1024, 14, 14] 0
Conv2d-66 [-1, 256, 14, 14] 262,144
BatchNorm2d-67 [-1, 256, 14, 14] 512
Conv2d-68 [-1, 256, 14, 14] 589,824
BatchNorm2d-69 [-1, 256, 14, 14] 512
Conv2d-70 [-1, 1024, 14, 14] 262,144
BatchNorm2d-71 [-1, 1024, 14, 14] 2,048
Bottleneck-72 [-1, 1024, 14, 14] 0
Conv2d-73 [-1, 256, 14, 14] 262,144
BatchNorm2d-74 [-1, 256, 14, 14] 512
Conv2d-75 [-1, 256, 14, 14] 589,824
BatchNorm2d-76 [-1, 256, 14, 14] 512
Conv2d-77 [-1, 1024, 14, 14] 262,144
BatchNorm2d-78 [-1, 1024, 14, 14] 2,048
Bottleneck-79 [-1, 1024, 14, 14] 0
Conv2d-80 [-1, 256, 14, 14] 262,144
BatchNorm2d-81 [-1, 256, 14, 14] 512
Conv2d-82 [-1, 256, 14, 14] 589,824
BatchNorm2d-83 [-1, 256, 14, 14] 512
Conv2d-84 [-1, 1024, 14, 14] 262,144
BatchNorm2d-85 [-1, 1024, 14, 14] 2,048
Bottleneck-86 [-1, 1024, 14, 14] 0
Conv2d-87 [-1, 256, 14, 14] 262,144
BatchNorm2d-88 [-1, 256, 14, 14] 512
Conv2d-89 [-1, 256, 14, 14] 589,824
BatchNorm2d-90 [-1, 256, 14, 14] 512
Conv2d-91 [-1, 1024, 14, 14] 262,144
BatchNorm2d-92 [-1, 1024, 14, 14] 2,048
Bottleneck-93 [-1, 1024, 14, 14] 0
Conv2d-94 [-1, 256, 14, 14] 262,144
BatchNorm2d-95 [-1, 256, 14, 14] 512
Conv2d-96 [-1, 256, 14, 14] 589,824
BatchNorm2d-97 [-1, 256, 14, 14] 512
Conv2d-98 [-1, 1024, 14, 14] 262,144
BatchNorm2d-99 [-1, 1024, 14, 14] 2,048
Bottleneck-100 [-1, 1024, 14, 14] 0
Conv2d-101 [-1, 512, 14, 14] 524,288
BatchNorm2d-102 [-1, 512, 14, 14] 1,024
Conv2d-103 [-1, 512, 7, 7] 2,359,296
BatchNorm2d-104 [-1, 512, 7, 7] 1,024
Conv2d-105 [-1, 2048, 7, 7] 1,048,576
BatchNorm2d-106 [-1, 2048, 7, 7] 4,096
Conv2d-107 [-1, 2048, 7, 7] 2,097,152
BatchNorm2d-108 [-1, 2048, 7, 7] 4,096
Bottleneck-109 [-1, 2048, 7, 7] 0
Conv2d-110 [-1, 512, 7, 7] 1,048,576
BatchNorm2d-111 [-1, 512, 7, 7] 1,024
Conv2d-112 [-1, 512, 7, 7] 2,359,296
BatchNorm2d-113 [-1, 512, 7, 7] 1,024
Conv2d-114 [-1, 2048, 7, 7] 1,048,576
BatchNorm2d-115 [-1, 2048, 7, 7] 4,096
Bottleneck-116 [-1, 2048, 7, 7] 0
Conv2d-117 [-1, 512, 7, 7] 1,048,576
BatchNorm2d-118 [-1, 512, 7, 7] 1,024
Conv2d-119 [-1, 512, 7, 7] 2,359,296
BatchNorm2d-120 [-1, 512, 7, 7] 1,024
Conv2d-121 [-1, 2048, 7, 7] 1,048,576
BatchNorm2d-122 [-1, 2048, 7, 7] 4,096
Bottleneck-123 [-1, 2048, 7, 7] 0
AdaptiveAvgPool2d-124 [-1, 2048, 1, 1] 0
Linear-125 [-1, 1] 2,049
================================================================
Total params: 23,510,081
Trainable params: 23,510,081
Non-trainable params: 0
----------------------------------------------------------------
Input size (MB): 0.57
Forward/backward pass size (MB): 213.24
Params size (MB): 89.68
Estimated Total Size (MB): 303.50
----------------------------------------------------------------
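As a sanity check (not produced by the summary tool itself), the "Total params" figure above can be reproduced from the standard ResNet-50 bottleneck layout — expansion factor 4, block counts [3, 4, 6, 3], bias-free convolutions — with a single-logit head, which matches the Linear-125 row:

```python
# Rebuild the per-layer parameter counts from the ResNet-50 architecture.
def conv_params(k, c_in, c_out):
    # Conv2d with bias=False, as used throughout ResNet
    return k * k * c_in * c_out

def bn_params(c):
    # BatchNorm2d: learnable weight + bias per channel
    return 2 * c

def bottleneck_params(c_in, width, projection):
    p = conv_params(1, c_in, width) + bn_params(width)            # 1x1 reduce
    p += conv_params(3, width, width) + bn_params(width)          # 3x3
    p += conv_params(1, width, 4 * width) + bn_params(4 * width)  # 1x1 expand
    if projection:  # 1x1 shortcut on the first block of each stage
        p += conv_params(1, c_in, 4 * width) + bn_params(4 * width)
    return p

total = conv_params(7, 3, 64) + bn_params(64)  # stem: Conv2d-1 + BatchNorm2d-2
c_in = 64
for width, n_blocks in zip((64, 128, 256, 512), (3, 4, 6, 3)):
    for i in range(n_blocks):
        total += bottleneck_params(c_in, width, projection=(i == 0))
        c_in = 4 * width
total += 2048 * 1 + 1  # Linear-125: one logit, weight + bias

print(total)  # 23510081 -- matches "Total params" above
```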
================================================================
--- Custom Model Pruning Summary (Sparsity Analysis) ---
================================================================
Layer Name           | Total Params | Non-Zero Params | Sparsity (%)
----------------------------------------------------------------
conv1                |        9,408 |           9,061 |        3.69%
layer1.0.conv1       |        4,096 |           3,988 |        2.64%
layer1.0.conv2       |       36,864 |          34,037 |        7.67%
layer1.0.conv3       |       16,384 |          15,917 |        2.85%
layer1.0.shortcut.0  |       16,384 |          15,931 |        2.76%
layer1.1.conv1       |       16,384 |          15,563 |        5.01%
layer1.1.conv2       |       36,864 |          34,168 |        7.31%
layer1.1.conv3       |       16,384 |          15,991 |        2.40%
layer1.2.conv1       |       16,384 |          15,521 |        5.27%
layer1.2.conv2       |       36,864 |          34,141 |        7.39%
layer1.2.conv3       |       16,384 |          15,927 |        2.79%
layer2.0.conv1       |       32,768 |          31,090 |        5.12%
layer2.0.conv2       |      147,456 |         133,284 |        9.61%
layer2.0.conv3       |       65,536 |          62,944 |        3.96%
layer2.0.shortcut.0  |      131,072 |         124,089 |        5.33%
layer2.1.conv1       |       65,536 |          60,930 |        7.03%
layer2.1.conv2       |      147,456 |         133,521 |        9.45%
layer2.1.conv3       |       65,536 |          63,078 |        3.75%
layer2.2.conv1       |       65,536 |          60,930 |        7.03%
layer2.2.conv2       |      147,456 |         133,427 |        9.51%
layer2.2.conv3       |       65,536 |          63,141 |        3.65%
layer2.3.conv1       |       65,536 |          60,763 |        7.28%
layer2.3.conv2       |      147,456 |         133,364 |        9.56%
layer2.3.conv3       |       65,536 |          63,106 |        3.71%
layer3.0.conv1       |      131,072 |         121,733 |        7.13%
layer3.0.conv2       |      589,824 |         524,613 |       11.06%
layer3.0.conv3       |      262,144 |         248,231 |        5.31%
layer3.0.shortcut.0  |      524,288 |         486,150 |        7.27%
layer3.1.conv1       |      262,144 |         238,377 |        9.07%
layer3.1.conv2       |      589,824 |         524,757 |       11.03%
layer3.1.conv3       |      262,144 |         248,325 |        5.27%
layer3.2.conv1       |      262,144 |         238,027 |        9.20%
layer3.2.conv2       |      589,824 |         523,957 |       11.17%
layer3.2.conv3       |      262,144 |         248,464 |        5.22%
layer3.3.conv1       |      262,144 |         237,899 |        9.25%
layer3.3.conv2       |      589,824 |         523,521 |       11.24%
layer3.3.conv3       |      262,144 |         248,369 |        5.25%
layer3.4.conv1       |      262,144 |         237,883 |        9.25%
layer3.4.conv2       |      589,824 |         523,266 |       11.28%
layer3.4.conv3       |      262,144 |         248,512 |        5.20%
layer3.5.conv1       |      262,144 |         237,764 |        9.30%
layer3.5.conv2       |      589,824 |         522,539 |       11.41%
layer3.5.conv3       |      262,144 |         248,324 |        5.27%
layer4.0.conv1       |      524,288 |         474,932 |        9.41%
layer4.0.conv2       |    2,359,296 |       2,063,191 |       12.55%
layer4.0.conv3       |    1,048,576 |         972,963 |        7.21%
layer4.0.shortcut.0  |    2,097,152 |       1,898,677 |        9.46%
layer4.1.conv1       |    1,048,576 |         932,613 |       11.06%
layer4.1.conv2       |    2,359,296 |       2,063,982 |       12.52%
layer4.1.conv3       |    1,048,576 |         972,881 |        7.22%
layer4.2.conv1       |    1,048,576 |         932,755 |       11.05%
layer4.2.conv2       |    2,359,296 |       2,061,971 |       12.60%
layer4.2.conv3       |    1,048,576 |         970,895 |        7.41%
linear               |        2,048 |           1,781 |       13.04%
----------------------------------------------------------------
TOTAL PRUNABLE       |   23,456,960 |      21,111,264 |       10.00%
================================================================
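The Sparsity (%) column is 100 × (zeroed / total), where zeroed weights are those pruned to exactly zero. A minimal sketch spot-checking a few rows (values copied from the table above):

```python
# Sparsity = share of weights pruned to exactly zero in each layer.
def sparsity_pct(total, nonzero):
    return 100.0 * (total - nonzero) / total

# A few (Total Params, Non-Zero Params) pairs from the table above.
rows = {
    "conv1":          (9_408, 9_061),
    "layer4.0.conv2": (2_359_296, 2_063_191),
    "TOTAL PRUNABLE": (23_456_960, 21_111_264),
}
for name, (total, nonzero) in rows.items():
    print(f"{name}: {sparsity_pct(total, nonzero):.2f}%")
# conv1: 3.69%
# layer4.0.conv2: 12.55%
# TOTAL PRUNABLE: 10.00%
```

Note that the pruned weights are stored as zeros rather than removed, which is why "Total params" in the summary above is unchanged at 23,510,081.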
Heatmap of the weight matrix for a sample layer; white pixels represent pruned (zeroed) weights.